In the theory of vector spaces, the concepts of linear dependence and linear independence of the vectors in a subset of the vector space are central to the definition of dimension. A set of vectors is said to be linearly dependent if one of the vectors in the set can be written as a linear combination of the other vectors. If no vector in the set can be written in this way, then the vectors are said to be linearly independent.〔G. E. Shilov, ''Linear Algebra'' (trans. R. A. Silverman), Dover Publications, New York, 1977.〕 A vector space is of finite or infinite dimension according to the maximum number of linearly independent vectors it contains. The definition of linear dependence, and the ability to determine whether a subset of the vectors in a vector space is linearly dependent, are central to determining a set of basis vectors for a vector space.

== Definition ==

The vectors in a subset ''S'' = {''v''<sub>1</sub>, ''v''<sub>2</sub>, ..., ''v''<sub>''n''</sub>} of a vector space ''V'' are said to be ''linearly dependent'' if there exist a ''finite'' number of ''distinct'' vectors ''v''<sub>1</sub>, ''v''<sub>2</sub>, ..., ''v''<sub>''k''</sub> in ''S'' and scalars ''a''<sub>1</sub>, ''a''<sub>2</sub>, ..., ''a''<sub>''k''</sub>, not all zero, such that

: <math>a_1 v_1 + a_2 v_2 + \cdots + a_k v_k = \mathbf{0},</math>

where <math>\mathbf{0}</math> denotes the zero vector.

Notice that if not all of the scalars are zero, then at least one is non-zero, say ''a''<sub>1</sub>, in which case this equation can be written in the form

: <math>v_1 = \frac{-a_2}{a_1} v_2 + \cdots + \frac{-a_k}{a_1} v_k.</math>

Thus ''v''<sub>1</sub> is shown to be a linear combination of the remaining vectors.

The vectors in a set ''T'' = {''v''<sub>1</sub>, ''v''<sub>2</sub>, ..., ''v''<sub>''n''</sub>} are said to be ''linearly independent'' if the equation

: <math>a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \mathbf{0}</math>

can only be satisfied by ''a''<sub>''i''</sub> = 0 for ''i'' = 1, ..., ''n''. This implies that no vector in the set can be represented as a linear combination of the remaining vectors in the set. In other words, a set of vectors is linearly independent if the only representation of the zero vector as a linear combination of its vectors is the trivial representation, in which all the scalars ''a''<sub>''i''</sub> are zero.
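The definition yields a practical test: stack the vectors as the columns of a matrix ''A''; the equation ''a''<sub>1</sub>''v''<sub>1</sub> + ··· + ''a''<sub>''n''</sub>''v''<sub>''n''</sub> = 0 has only the trivial solution exactly when ''A'' has full column rank. The following is a minimal sketch of that test in Python with NumPy; the function name <code>is_linearly_independent</code> and the sample vectors are illustrative choices, not part of the article.

<syntaxhighlight lang="python">
import numpy as np

def is_linearly_independent(vectors):
    """Return True if the given vectors are linearly independent.

    The vectors become the columns of a matrix A. The homogeneous
    equation a_1 v_1 + ... + a_n v_n = 0 has only the trivial
    solution exactly when A has full column rank.
    """
    A = np.column_stack(vectors)              # each vector is one column
    return np.linalg.matrix_rank(A) == A.shape[1]

# The standard basis of R^3 is linearly independent.
print(is_linearly_independent([np.array([1, 0, 0]),
                               np.array([0, 1, 0]),
                               np.array([0, 0, 1])]))   # True

# (1, 2, 3) = 1*(1, 0, 1) + 2*(0, 1, 1), a nontrivial combination,
# so this set is linearly dependent.
print(is_linearly_independent([np.array([1, 0, 1]),
                               np.array([0, 1, 1]),
                               np.array([1, 2, 3])]))   # False
</syntaxhighlight>

Note that <code>numpy.linalg.matrix_rank</code> determines the rank numerically from a singular-value tolerance, so with floating-point data a set of vectors that is very nearly dependent may be reported as dependent; exact arithmetic would require a symbolic tool instead.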